  1. Free, publicly-accessible full text available December 1, 2025
  2. This paper studies differentially private stochastic convex optimization (DP-SCO) in the presence of heavy-tailed gradients, where only a $k$-th-moment bound on the sample Lipschitz constants is assumed instead of a uniform bound. The authors propose a reduction-based approach that achieves the first near-optimal error rates (up to logarithmic factors) in this setting. Specifically, under $(\epsilon, \delta)$-approximate differential privacy, they achieve an error bound of $\frac{G_2}{\sqrt{n}} + G_k \cdot \left(\frac{\sqrt{d}}{n\epsilon}\right)^{1 - \frac{1}{k}}$, up to a mild polylogarithmic factor in $\frac{1}{\delta}$, where $G_2$ and $G_k$ are the 2nd and $k$-th moment bounds on the sample Lipschitz constants; a numeric sketch of this bound follows after this list. This nearly matches the lower bound established by Lowy and Razaviyayn (2023). Beyond this basic result, the authors introduce a suite of private algorithms that further improve performance under additional assumptions: an optimal algorithm when the Lipschitz constant is known, a near-linear-time algorithm for smooth functions, and an optimal linear-time algorithm for smooth generalized linear models.
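To make the scaling of the error bound concrete, here is a minimal Python sketch (not from the paper; the function and parameter names are illustrative assumptions) that evaluates the two terms of the bound, ignoring the polylogarithmic factor in $\frac{1}{\delta}$:

```python
import math

def dp_sco_heavy_tailed_bound(n, d, eps, G2, Gk, k):
    """Evaluate the near-optimal DP-SCO error bound
        G_2 / sqrt(n) + G_k * (sqrt(d) / (n * eps)) ** (1 - 1/k),
    ignoring the polylogarithmic factor in 1/delta.
    All names here are illustrative, not taken from the paper.
    """
    statistical_term = G2 / math.sqrt(n)  # non-private rate, driven by the 2nd moment
    privacy_term = Gk * (math.sqrt(d) / (n * eps)) ** (1.0 - 1.0 / k)  # cost of privacy
    return statistical_term + privacy_term

# Larger k (stronger moment assumptions) pushes the exponent 1 - 1/k toward 1,
# approaching the familiar uniformly-Lipschitz privacy rate sqrt(d) / (n * eps).
for k in (2, 4, 8, 64):
    bound = dp_sco_heavy_tailed_bound(n=10_000, d=100, eps=1.0, G2=1.0, Gk=1.0, k=k)
    print(f"k={k:>2}: bound = {bound:.5f}")
```

As the loop illustrates, the statistical term is fixed by $n$ and $G_2$, while the privacy term shrinks as $k$ grows, interpolating between the heavy-tailed and uniformly Lipschitz regimes.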